Why Markets Could (But Don't Currently) Solve Resource Allocation Problems in Systems
Using market mechanisms for resource allocation in distributed systems is not a new idea, nor is it one that has caught on in practice or with a large body of computer science research. Yet, projects that use markets for distributed resource allocation recur every few years [1, 2, 3], and a new generation of research is exploring market-based resource allocation mechanisms [4, 5, 6, 7, 8] for distributed environments such as Planetlab, Netbed, and computational grids.
This paper has three goals. The first is to explore why markets can be appropriate for allocation when simpler allocation mechanisms exist. The second is to demonstrate why a new look at markets for allocation could be timely rather than a re-hash of previous research. The third is to point out some of the thorny problems inherent in market deployment and to suggest action items both for market designers and for the greater research community. We are optimistic about the power of market design, but we also believe that key challenges in markets/systems integration must be overcome for market-based computer resource allocation systems to succeed.
Mirage: A Microeconomic Resource Allocation System for Sensornet Testbeds
In this paper, we argue that a microeconomic resource allocation scheme, specifically the combinatorial auction, is well suited to testbed resource management. To demonstrate this, we present the Mirage resource allocation system. In Mirage, testbed resources are allocated using a repeated combinatorial auction within a closed virtual currency environment. Users compete for testbed resources by submitting bids which specify resource combinations of interest in space/time (e.g., "any 32 MICA2 motes for 8 hours anytime in the next three days") along with a maximum value amount the user is willing to pay. A combinatorial auction is then periodically run to determine the winning bids based on supply and demand while maximizing aggregate utility delivered to users. We have implemented a fully functional and secure prototype of Mirage and have been operating it in daily use for approximately four months on Intel Research Berkeley's 148-mote sensornet testbed.
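The winner-determination step of a combinatorial auction can be illustrated with a minimal sketch: among all conflict-free subsets of bids, choose the one that maximizes total bid value. This is a hypothetical brute-force illustration, not the actual Mirage implementation (which must solve this NP-hard problem at scale and handle time windows); the bid structure here is an assumption for clarity.

```python
from itertools import combinations

# Illustrative winner determination for a combinatorial auction.
# Each bid is (resources, value): a set of requested motes and the
# maximum amount the bidder is willing to pay for the whole bundle.
def winner_determination(bids):
    """Return (best_value, winning_bid_indices) by exhaustive search."""
    best_value, best_set = 0.0, []
    n = len(bids)
    for r in range(1, n + 1):
        for combo in combinations(range(n), r):
            used = set()
            feasible = True
            for i in combo:
                resources, _ = bids[i]
                if used & resources:      # two winning bids may not share a mote
                    feasible = False
                    break
                used |= resources
            if feasible:
                total = sum(bids[i][1] for i in combo)
                if total > best_value:
                    best_value, best_set = total, list(combo)
    return best_value, best_set

bids = [
    (frozenset({"mote1", "mote2"}), 5.0),
    (frozenset({"mote2", "mote3"}), 4.0),
    (frozenset({"mote3", "mote4"}), 3.0),
]
value, winners = winner_determination(bids)  # bids 0 and 2 are disjoint
```

Exhaustive search is exponential in the number of bids; a production system would use an integer-programming or approximation solver, but the objective, maximizing aggregate value subject to resource conflicts, is the same.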
Two Auction-Based Resource Allocation Environments: Design and Experience
Many computer systems have reached the point where the goal of resource allocation is no longer to maximize utilization; instead, when demand exceeds supply and not all needs can be met, one needs a policy to guide resource allocation decisions. One natural policy is to seek efficient usage, which allocates resources to the set of users who have the highest utility for the use of the resources. Researchers have frequently proposed market-based mechanisms to provide such a goal-oriented way to allocate resources among competing interests while maximizing overall utility of the users.
Exposing Inconsistent Web Search Results with Bobble
Given their critical role as gateways to Web content, the search results a Web search engine provides to its users have an out-sized impact on the way each user views the Web. Previous studies have shown that popular Web search engines like Google employ sophisticated personalization engines that can occasionally provide dramatically inconsistent views of the Web to different users. Unfortunately, even if users are aware of this potential, it is not straightforward for them to determine the extent to which a particular set of search results differs from those returned to other users, nor the factors that contribute to this personalization. We present the design and implementation of Bobble, a Web browser extension that contemporaneously executes a user's Google search query from a variety of different world-wide vantage points under a range of different conditions, alerting the user to the extent of inconsistency present in the set of search results returned to them by Google. Using more than 75,000 real search queries issued by over 170 users during a nine-month period, we explore the frequency and nature of inconsistencies that arise in Google search queries. In contrast to previously published results, we find that 98% of all Google search results display some inconsistency, with a user's geographic location being the dominant factor influencing the nature of the inconsistency.
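The core comparison a tool like Bobble performs can be sketched as measuring how much result sets from different vantage points overlap. The functions and the Jaccard-based inconsistency score below are illustrative assumptions, not Bobble's actual metric or API.

```python
# Hypothetical sketch: quantify inconsistency across search-result lists
# gathered from different vantage points for the same query.
def jaccard(a, b):
    """Set overlap of two result lists (1.0 = identical membership)."""
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 1.0

def inconsistency(result_lists):
    """1 minus the worst pairwise Jaccard similarity: 0.0 means all
    vantage points saw the same results, values near 1.0 mean they
    saw largely disjoint result sets."""
    worst = 1.0
    for i in range(len(result_lists)):
        for j in range(i + 1, len(result_lists)):
            worst = min(worst, jaccard(result_lists[i], result_lists[j]))
    return 1.0 - worst

us_results = ["a.com", "b.com", "c.com"]
de_results = ["a.com", "c.com", "d.com"]
score = inconsistency([us_results, de_results])
```

A set-based score ignores ranking changes; a rank-aware measure (e.g. comparing result positions) would additionally flag reordering, which is another form of personalization the paper discusses.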
Inferring persistent interdomain congestion
There is significant interest in the technical and policy communities regarding the extent, scope, and consumer harm of persistent interdomain congestion. We provide empirical grounding for discussions of interdomain congestion by developing a system and method to measure congestion on thousands of interdomain links without direct access to them. We implement a system based on the Time Series Latency Probes (TSLP) technique that identifies links with evidence of recurring congestion suggestive of an under-provisioned link. We deploy our system at 86 vantage points worldwide and show that congestion inferred using our lightweight TSLP method correlates with other metrics of interconnection performance impairment. We use our method to study interdomain links of eight large U.S. broadband access providers from March 2016 to December 2017, and validate our inferences against ground-truth traffic statistics from two of the providers. For the period of time over which we gathered measurements, we did not find evidence of widespread endemic congestion on interdomain links between access ISPs and directly connected transit and content providers, although some such links exhibited recurring congestion patterns. We describe limitations, open challenges, and a path toward the use of this method for large-scale third-party monitoring of the Internet interconnection ecosystem.
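The intuition behind TSLP, that an under-provisioned link shows recurring (typically diurnal) elevation in minimum RTT as its queue fills during busy hours, can be sketched as follows. The bin size, thresholds, and function names are illustrative assumptions, not the parameters used in the paper.

```python
# Hypothetical sketch of the TSLP idea: bin RTT probes to a link's far
# end by hour of day, and flag the link if the per-hour minimum RTT is
# recurringly elevated above the link's quiescent baseline.
def recurring_elevation(rtt_by_hour, baseline_ms, threshold_ms=10.0, min_hours=3):
    """rtt_by_hour: dict mapping hour-of-day -> list of RTT samples (ms).
    Returns (congested?, sorted list of elevated hours). Using the per-bin
    minimum filters out transient queueing and path noise: if even the
    *minimum* RTT is elevated, the queue was persistently occupied."""
    elevated = [
        hour for hour, samples in rtt_by_hour.items()
        if samples and min(samples) - baseline_ms > threshold_ms
    ]
    return len(elevated) >= min_hours, sorted(elevated)

# Synthetic example: 20 ms baseline, with a 15 ms evening bump (18:00-22:00).
rtts = {h: [20.0 + (15.0 if 18 <= h <= 22 else 0.0), 21.0 + (15.0 if 18 <= h <= 22 else 0.0)]
        for h in range(24)}
congested, hours = recurring_elevation(rtts, baseline_ms=20.0)
```

A real deployment must also locate which interdomain link the probes traverse and distinguish congestion from route changes, which is where much of the system's complexity lies.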